Primal explicit max margin feature selection for nonlinear support vector machines
Authors

Abstract
Embedding feature selection in nonlinear SVMs leads to a challenging non-convex minimization problem, which can be prone to suboptimal solutions. This paper develops an effective algorithm to directly solve the embedded feature selection primal problem. We use a trust-region method, which is better suited to non-convex optimization than line-search methods and guarantees convergence to a minimizer. We devise an alternating optimization approach to tackle the problem efficiently, breaking it down into a convex subproblem, corresponding to standard SVM optimization, and a non-convex subproblem for feature selection. Importantly, we show that a straightforward alternating optimization approach can be susceptible to saddle point solutions. We propose a novel technique that shares an explicit margin variable between the subproblems to overcome saddle point convergence and improve solution quality. Experimental results show our method outperforms the state-of-the-art embedded SVM feature selection method, as well as other leading filter and wrapper approaches.
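The alternating scheme the abstract describes can be illustrated with a minimal sketch: feature selection is embedded as a per-feature scaling vector `sigma`, and the method alternates between the convex SVM subproblem in `(w, b)` with `sigma` fixed and a step on `sigma` with `(w, b)` fixed. All names (`sigma`, `lr`, the step counts) and the toy data are illustrative assumptions, not taken from the paper; the subgradient solver below stands in for the paper's trust-region method.

```python
# Hedged sketch of alternating optimization for embedded SVM feature
# selection. sigma scales the input features; we alternate between
# (1) the convex subproblem: fit the SVM weights (w, b) on sigma-scaled
#     data by subgradient descent, and
# (2) the non-convex subproblem: a projected gradient step on sigma.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n))

sigma = np.ones(d)   # feature selection / scaling weights
w = np.zeros(d)      # SVM primal weights
b = 0.0
C, lr = 1.0, 0.05

def objective(w, b, sigma):
    """Primal objective with scaled features: 0.5||w||^2 + C * hinge loss."""
    margins = y * ((X * sigma) @ w + b)
    return 0.5 * w @ w + C * np.maximum(0.0, 1.0 - margins).sum()

for _ in range(50):
    # (1) convex subproblem: fit (w, b) with sigma held fixed
    Xs = X * sigma
    for _ in range(20):
        active = y * (Xs @ w + b) < 1          # margin violators
        grad_w = w - C * (y[active][:, None] * Xs[active]).sum(axis=0)
        grad_b = -C * y[active].sum()
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    # (2) non-convex subproblem: projected gradient step on sigma >= 0
    active = y * ((X * sigma) @ w + b) < 1
    grad_sigma = -C * ((y[active][:, None] * X[active]) * w).sum(axis=0)
    sigma = np.maximum(0.0, sigma - lr * grad_sigma / n)
```

Note that this plain alternation is exactly the scheme the paper warns can stall at saddle points; the paper's actual contribution (the shared explicit margin variable and the trust-region solver) is not reproduced here.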
Similar resources
Margin-based Feature Selection Techniques for Support Vector Machine Classification
Feature selection for classification working in high-dimensional feature spaces can improve generalization accuracy, reduce classifier complexity, and is also useful for identifying the important feature “markers”, e.g., biomarkers in a bioinformatics or biomedical context. For support vector machine (SVM) classification, a widely used feature selection technique is recursive feature elimination...
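The recursive feature elimination scheme this snippet refers to can be sketched briefly: repeatedly fit a linear classifier and discard the feature whose squared weight is smallest. For brevity the inner "SVM" is replaced here by a least-squares linear fit (an assumption; SVM-RFE proper ranks features by a trained SVM's primal weights), and the data and variable names are illustrative.

```python
# Hedged sketch of recursive feature elimination (RFE). Features with
# the smallest squared weight in a linear fit are dropped one per round.
import numpy as np

rng = np.random.default_rng(0)
n, d = 150, 8
X = rng.normal(size=(n, d))
# only features 0 and 3 carry signal
y = np.sign(2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=n))

remaining = list(range(d))        # indices of surviving features
while len(remaining) > 2:         # stop once 2 features are left
    # least-squares weights stand in for the SVM's primal weights
    w, *_ = np.linalg.lstsq(X[:, remaining], y, rcond=None)
    worst = int(np.argmin(w ** 2))            # least informative feature
    remaining.pop(worst)
```

On data like this, the informative features tend to survive the elimination rounds while the noise features are pruned first.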
Least Squares Support Vector Machines and Primal Space Estimation
In this paper a methodology for estimation in kernel-induced feature spaces is presented, making a link between the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM) and classical statistical inference techniques in order to perform linear regression in primal space. This is done by computing a finite dimensional approximation of the kernel-induced feature space mapping ...
Explicit Max Margin Input Feature Selection for Nonlinear SVM using Second Order Methods
Incorporating feature selection in nonlinear SVMs leads to a large and challenging nonconvex minimization problem, which can be prone to suboptimal solutions. We use a second order optimization method that utilizes eigenvalue information and is less likely to get stuck at suboptimal solutions. We devise an alternating optimization approach to tackle the problem efficiently, breaking it down int...
Max-margin Multiple-Instance Learning via Semidefinite Programming
In this paper, we present a novel semidefinite programming approach for multiple-instance learning. We first formulate the multiple-instance learning as a combinatorial maximum-margin optimization problem with additional instance selection constraints within the framework of support vector machines. Although solving this primal problem requires non-convex programming, we nevertheless can then der...
MIT 9.520/6.860 Project: Feature selection for SVM
We consider sparse learning binary classification problems solved with linear support vector machines. We present two popular methods for variable selection: the SVM-Recursive Feature Elimination algorithm and the 1-norm SVM, and propose a third, hybrid 1-norm RFE. Finally, we implement these three algorithms and compare their performance on synthetic and microarray datasets. The 1-norm SVM gives the lowest test accura...
Journal: Pattern Recognition
Volume: 47, Issue: -
Pages: -
Publication year: 2014